Mixtures of Conditional Maximum Entropy Models

Authors

  • Dmitry Pavlov
  • Alexandrin Popescul
  • David M. Pennock
  • Lyle H. Ungar
Abstract

Driven by successes in several application areas, maximum entropy modeling has recently gained considerable popularity. We generalize the standard maximum entropy formulation of classification problems to better handle the case where complex data distributions arise from a mixture of simpler underlying (latent) distributions. We develop a theoretical framework for characterizing data as a mixture of maximum entropy models. We formulate a maximum-likelihood interpretation of mixture-model learning and derive a generalized EM algorithm to solve the corresponding optimization problem. We present empirical results for a number of data sets showing that modeling the data as a mixture of latent maximum entropy models gives significant improvement over the standard, single-component, maximum entropy approach.
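The abstract describes the model only verbally; the following is a minimal, hypothetical sketch of what a mixture of conditional maximum entropy classifiers trained with a generalized EM procedure can look like, assuming multinomial-logistic (log-linear) components and a single gradient-ascent step as the generalized M-step. The function and variable names (fit_mixture_maxent, softmax, F, y, W, pi) are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def softmax(z, axis=-1):
        # Numerically stable softmax.
        z = z - z.max(axis=axis, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=axis, keepdims=True)

    def fit_mixture_maxent(F, y, n_components=2, n_classes=None,
                           n_iter=100, lr=0.1, seed=0):
        # F: (N, D) feature matrix; y: (N,) integer class labels.
        rng = np.random.default_rng(seed)
        N, D = F.shape
        K = n_classes if n_classes is not None else int(y.max()) + 1
        W = rng.normal(scale=0.01, size=(n_components, D, K))  # per-component weights
        pi = np.full(n_components, 1.0 / n_components)          # mixing proportions

        for _ in range(n_iter):
            # E-step: responsibilities r[n, m] proportional to pi[m] * p_m(y_n | x_n).
            logits = np.einsum('nd,mdk->nmk', F, W)              # (N, M, K)
            p = softmax(logits, axis=-1)                         # p_m(y | x_n)
            lik = p[np.arange(N), :, y]                          # (N, M): p_m(y_n | x_n)
            r = pi * lik
            r /= r.sum(axis=1, keepdims=True)

            # Generalized M-step: update mixing weights exactly, then take one
            # gradient-ascent step on each component's responsibility-weighted
            # conditional log-likelihood (improve, not fully maximize).
            pi = r.mean(axis=0)
            onehot = np.eye(K)[y]                                # (N, K)
            for m in range(n_components):
                grad = F.T @ (r[:, m:m + 1] * (onehot - p[:, m, :])) / N
                W[m] += lr * grad
        return pi, W

The E-step assigns each training example soft responsibilities over components; because the M-step only improves each component's weighted conditional log-likelihood rather than maximizing it, the loop is a generalized EM procedure in the sense used in the abstract.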

Similar resources

Sequence Modeling with Mixtures of Conditional Maximum Entropy Distributions

We present a novel approach to modeling sequences using mixtures of conditional maximum entropy distributions. Our method generalizes the mixture of first-order Markov models by including "long-term" dependencies in the model components. These "long-term" dependencies are represented by probabilistic triggers or rules of the kind frequently used in the natural language processing (NLP) domain (such as “A...
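Read as a formula, and under the assumption that each component is a conditional maximum entropy distribution over the next symbol, the model family described above has roughly the form

    p(s_1, \ldots, s_T) \;=\; \sum_{m=1}^{M} \pi_m \prod_{t=1}^{T} p_m\big(s_t \mid h_t\big),
    \qquad
    p_m(s_t \mid h_t) \;=\; \frac{\exp\big(\sum_i \lambda_{mi} f_i(h_t, s_t)\big)}{Z_m(h_t)},

where the history h_t contains the previous symbol s_{t-1} together with trigger features over earlier symbols; with only the s_{t-1} features, this reduces to a plain mixture of first-order Markov chains. The exact feature set and notation here are assumptions made for illustration, not quoted from the paper.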

Maximum Entropy Modeling Toolkit

The Maximum Entropy Modeling Toolkit supports parameter estimation and prediction for statistical language models in the maximum entropy framework. The maximum entropy framework provides a constructive method for obtaining the unique conditional distribution p*(y|x) that satisfies a set of linear constraints and maximizes the conditional entropy H(p|f) with respect to the empirical distribution...
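For reference, the constrained-optimization view mentioned above is usually written as follows; the notation is the standard one and is not taken verbatim from the toolkit documentation:

    p^* \;=\; \arg\max_{p}\; H(p)
    \quad \text{subject to} \quad
    \sum_{x,y} \tilde p(x)\, p(y \mid x)\, f_i(x,y) \;=\; \sum_{x,y} \tilde p(x,y)\, f_i(x,y) \;\;\text{for all } i,

whose solution has the log-linear form

    p^*(y \mid x) \;=\; \frac{\exp\big(\sum_i \lambda_i f_i(x,y)\big)}{\sum_{y'} \exp\big(\sum_i \lambda_i f_i(x,y')\big)},

with one parameter \lambda_i per linear constraint f_i.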

A Preferred Definition of Conditional Rényi Entropy

The Rényi entropy is a generalization of Shannon entropy to a one-parameter family of entropies. Tsallis entropy, too, is a generalization of Shannon entropy; its measure is non-logarithmic. After the introduction of Shannon entropy, the conditional Shannon entropy was derived and its properties became known. Also, for Tsallis entropy, the conditional entropy was introduced a...
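For concreteness, the two generalizations mentioned above are, in their unconditional form,

    H_\alpha(X) \;=\; \frac{1}{1-\alpha} \log \sum_x p(x)^\alpha,
    \qquad
    S_q(X) \;=\; \frac{1}{q-1} \Big( 1 - \sum_x p(x)^q \Big),

both of which recover the Shannon entropy H(X) = -\sum_x p(x) \log p(x) in the limit \alpha \to 1 (respectively q \to 1). Several non-equivalent conditional versions H_\alpha(X \mid Y) have been proposed, which is the issue the paper addresses; no particular conditional definition is assumed here.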

Conditional and joint models for grapheme-to-phoneme conversion

In this work, we introduce several models for grapheme-to-phoneme conversion: a conditional maximum entropy model, a joint maximum entropy n-gram model, and a joint maximum entropy n-gram model with syllabification. We examine the relative merits of conditional and joint models for this task, and find that joint models have many advantages. We show that the performance of our best model, the joi...
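Schematically, and using notation assumed here for illustration rather than taken from the paper, the contrast is between a conditional model that predicts phonemes from grapheme context,

    p(\varphi_1^T \mid g_1^T) \;=\; \prod_t p\big(\varphi_t \mid \text{context}(g, \varphi_{<t})\big),

and a joint n-gram model over aligned grapheme-phoneme pairs,

    p(g_1^T, \varphi_1^T) \;=\; \prod_t p\big((g,\varphi)_t \mid (g,\varphi)_{t-n+1}^{\,t-1}\big),

where in both cases the component distributions can be parameterized as maximum entropy models.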

Chinese Named Entity Recognition with Conditional Probabilistic Models

This paper describes the work on Chinese named entity recognition performed by the Yahoo team at the third International Chinese Language Processing Bakeoff. We used two conditional probabilistic models for this task, including conditional random fields (CRFs) and maximum entropy models. In particular, we trained two conditional random field recognizers and one maximum entropy recognizer for identi...

Publication date: 2003